Why is the UK government siding with the tech bros? Coming up on this special edition of Talking Politics.
>> It's a bunch of crossroads.
>> Seeing them sitting right there next to
Donald Trump at the inauguration was
pretty alarming.
>> Politics is a funny game. The leadership
thinks that AI is going to save us. Many
people think it will sink us. How the
hell did we let the tech bros get into
this position?
Hello. This year has seen some of the biggest names in entertainment take on the UK government over its plans to let tech firms use their copyrighted work to train AI. Elton John and Dua Lipa may have stolen the headlines, but the backlash was led in parliament by the award-winning film director Beeban Kidron. Beeban first rose to prominence adapting Jeanette Winterson's brilliant novel Oranges Are Not the Only Fruit for TV, before a career directing documentaries and feature films, most famously Bridget Jones: The Edge of Reason. She became Baroness Kidron as a crossbench peer in 2012 and is now globally respected for her judgments on child safety online and artificial intelligence, serving as an adviser to the UN and indeed Oxford University's Institute for Ethics in AI. And she's very kindly granted us this exclusive interview. Baroness, welcome to Talking Politics.
>> Great to be here.
>> Listen, there's lots to talk about, and we will talk about your campaign to try and stop AI companies taking everything they want without paying for it. But first, I think a lot of people listening or watching this will be wondering: okay, I hear so much discussion about AI, I hear a lot of debate about what AI is going to change and potentially what it's not going to change. Lots of people will actually be worried about their jobs and professions, and thinking, well, okay, are we going to be obsolete?
Perhaps you could just give us an overview to begin with. All technological revolutions, you know, don't play out the way people often think, do they? They can't be stopped; that's one lesson from history. And they don't always play out the way you expect. We're in the early stages of this one, but you've studied it and looked at it in great detail. What industries do you think it will most change? What jobs? Where do you think it'll have the most impact? And where do you think it's not really going to have any impact at all?
>> That's a very big question. I like it: thanks for that, just look into the future and tell me what it will be.
>> And we'll have you back in five years to
check whether you were right.
>> Exactly. Well, I think the first thing to say is it's going to change a lot of things a little bit and some things a lot. And I think that's something that gets missed. So yes, it's very good at crunching information and actually summarizing. And wherever that is a function, it doesn't matter whether it's in white collar work or, you know, in factories or in the information sector, it will be good at summarizing. Will it be good enough is always the question. And I think that what we're going to see is a lot of attempts to replace things and then finding it can't do the last mile. If you look at, say, the history of automated vehicles, they've actually been able to self-drive for years. What they can't do is avoid human beings. So that last...
>> Feels a bit crucial.
>> Feels a bit crucial, and that's why they're not fully with us now, even though they've been on the cards, next year, next year, next year, for at least three or four decades, you know. So I think that what we're going to see is some incredible things in medicine, in life sciences, in places where seeing a bigger pattern and crunching it down to an essential understanding is what's needed; in those places it's going to be really impactful. And I think in business, you know, supply lines and things like that, it's going to be impactful. It's going to understand the logic in a way that is perhaps beyond human logic. Then, in some of the things that are more nuanced and between people, I'm going to be really interested to see how it does that last mile, or in fact how people's jobs change to be the people who check whether the AI did it properly.
>> Yeah. You've spent your career in the creative industries. So let's talk about what AI is going to change, whether we like it or not, in the creative industries, and obviously every time any two writers get together at the moment they talk about this, let alone filmmakers more generally. In the most extreme version, you know, you have, in the future, companies, let's say, that create bands and write songs, where the bands aren't real and the songs are made by AI and the band is made by AI, and you put it up on YouTube or you put it out on Spotify, and maybe people don't buy into that or maybe they do. We don't know. Equally, from what you're saying, you might be able to have an idea for a film, let's say, and, you know, you learn to use AI in a sophisticated way, you're a creative, and you make that film not involving real people, not involving real locations, not involving a director or anything else, for a much smaller amount than the film would currently cost. And maybe you put it up on YouTube, already the second most watched TV channel. You can see a business model developing like that. But what is true is that it's a business model for the people who are making it that's probably a fraction of the size it is now. Is it right to think that's one possibility? Now, I don't say that everyone will buy into that. They might not. There might be people who try it and people just, you know, won't buy into a film that doesn't have real people. But are we right to think that, in one sense, for everyone who's working in the creative industries, that's the worst case scenario: that people do buy into these things, and you can make incredibly impressive stuff for a tiny amount of money, and the only person who makes any money is the AI company and possibly the creative who made it? Or is that exaggerated or unlikely or unrealistic?
>> I think... I mean, I don't know, and no one knows. So I think it would be wrong of me to say, oh, I have the answer.
>> I definitely think that is one aspect, but I look at it slightly differently. I go: human beings are innately creative. They're creative about whatever they do. Some traditions are really craft driven. So, you know, in Japan they make the most beautiful bowls and the most beautiful wooden boxes in a way that we don't bother with. Their paper is nicer than ours, you know, etc. etc. So through time and through geography, creativity has always been apparent wherever human beings are, and they find a way. So I have this sort of, on the one hand... I'm also a very particular age, which is that when I started making movies it was on cameras with celluloid, and the editor used to, boom boom, cut the actual film, and we used to stick it together and so on, and that gave the creator an awful lot of power. By the time I left the film business to go into politics, it was all digital and the producer in LA could recut your movie while you weren't even in the room. So we've seen these moves.
We're on a trajectory. This is not all new; it is an exaggeration, an explosion, of a journey we're already on. And there are some things I really dislike about that journey, and there are some things that are fantastic about the journey. And the question is: do we, as a community of creatives, as a community of journalists, or even as a political community, i.e., you know, as a nation, have something to say about which bits of that journey we would like to see and which bits we would not like to see? And I think it becomes a political question, a cultural question. And one of my biggest worries in it all is that the AI companies are very, very clear that they have an America-first agenda, and they want American culture, American tools, American business and American riches. And I think that's something we have to consider when we start going, "Oh, I'd like something by Tom Bradby, you know, in the manner of", we hope, without involving you, without paying you, without your permission. Because actually what it will be replaced by is a sort of, you know, different kind of culture. But I think we have to look at uses and abuses, at purposes, at winners, at losers. And ultimately creativity is about the human experience, and I think, irrespective of AI, there will be people on stages singing, playing, dancing...
>> I think that's unarguable
>> You know, and that will actually be a godsend. The minute that they are solely relying on AI-derived content, most people think, the models will start to break. They will need human content. It will be scarce. It will be valuable. And us creators will be right there to fill the gap. So...
>> Let's talk about your campaign. I like the optimism, so let's embrace that. Let's talk about your campaign, because not everyone is, you know, going to have followed all of this. Full disclosure: The Atlantic magazine in America has created a database where, if you're a writer, you can put your name in and see whether Meta has used your books or films or whatever to train its AI model. And I put my name in
and you know, like every other writer,
it comes up. And I remember doing it and
I was like, I don't know, weirdly quite
shocked. I was like, what? I mean, and
obviously, you know, I'm an impartial
broadcaster, so I need to try and see
both sides here. Part of me is like
outraged. How dare you? And a part of me
thinks, well,
you know, is it the equivalent of a human being reading your work, along with hundreds or thousands of other works, and that being part of their creative inspiration, if you like? You know, they read lots of things, and many things in creative worlds are derivative. So let's just deal with the principle first. You can probably tell by the way I'm articulating this that I don't wholly believe it, but let's just try and see the other side. Do you think there's any value in the AI companies' argument, which is, you know, this is a new way of being, and they're training these models by putting things in? Is there any value in that argument whatsoever?
>> Well, not really.
>> I didn't articulate it very well.
>> No, but I understand. I mean, here's the thing, you know: in any other business, you pay for your raw material. Yeah. If you are making a table, you pay for the wood, you also pay for the design, you pay for the workmanship, and you normally pay for it to be sold. So there's a whole chain of events. I think there's something a little bit more about the principle, really, which is that in human rights law it is a human right to own your intellectual, artistic and scientific output. It is of you, and so you also have a moral right to your output, and you may or may not want Meta to have it. So the first principle is not really whether they paid for it. It's that they took it without asking you. And actually it doesn't really matter, sort of, you know, in this rather polarized world that we live in: if you are to the conservative side of the equation, think about it as a property right. And if you're to the other side of the equation, think about it as fair pay for, you know, your labour. But whichever way you look at it, it's yours.
>> Let's just establish where we are and where we're not. The government's bill on this subject basically has the clause, as I understand it, that tech bros can take all our intellectual property, and we have the right to opt out potentially, but they don't seem to have clarified how we do that or how that works. Is that right? Roughly right?
>> So the facts are right; the vehicle is wrong. I mean, the truth is, what the government did was they said, we're going to have an AI bill and, what's more, we're going to have a consultation about copyright, and that's what they put forward in the consultation. I used the data bill, which was already passing through parliament, to try and stop them doing that. So it's a little bit more complicated.
>> I didn't succeed. Um...
>> So right now, unless they do something else, we're going down that road?
>> No.
>> Okay.
>> Politics is a funny game. So I could never absolutely succeed, because they have such a big majority, and it is in the rules of parliament that the House of Lords always finally gives in to the Commons. What was extraordinary was how strongly it was meant and felt in the Lords, because we sent it back five times, and that is...
>> which is a great deal. Normally we do it once, maybe twice.
>> Five times, because they were not observing the other rule, which is that if the House of Lords feels very strongly, they compromise.
>> In the end, what they did was they said, okay, we won't take opt-out. So they have withdrawn opt-out as their preferred option.
>> Right.
>> They did agree to do an impact assessment. They have agreed to bring the creative companies onto an equal playing field with the tech companies in all the consultations, which was not the case previously. And they have set up, or are in the process of setting up, groups in parliament to make sure that parliament is heard in their plans. Um,
if you want my cynical view, I think
they're hoping
>> definitely do
>> I think they're planning to kick it into
the long grass and make it someone
else's problem.
>> Have a review.
>> It'll take at least five years and then
everyone will have forgotten about it.
>> I think that's what they're going to do. And meanwhile they're going to try and make, well, they've made three big deals that I know of, I'm sure they've made more, and they're making a lot of deals with tech companies about government data and about other areas of data, and their language is all in on AI. So I think they've bought in, all in, on AI. And I'm very happy to say I think they're naive. I think they made a mistake. I think they went far too early. And I will say only this: since we so-called lost in parliament, many, many of the AI companies, whose names you will have heard, have rung me and said, you know what, we're surprised at the government's position; we don't actually have such a binary view; they have oversold our position. And so there is this sort of incredible irony where the AI community, particularly here in the UK but actually globally, are saying: we do know that we are eventually going to have to have a business model, otherwise human beings are not going to contribute anything to the new world order. We do know that we can't afford to displace the 2.4 million people who work in this industry. And actually we ourselves require an ongoing relationship with the creative community, because creativity, innovation and entrepreneurship are all very connected.
>> I tell you what I don't understand. I understand it's not your job to tell me what a sensible position for the government would be; you're not part of the government, and therefore it's not your job to defend it. But I don't understand why this is a sensible position. I mean, there are lots of things this government is doing that I don't approve or disapprove of; I can just see why they're doing it. There are some things they're doing that I don't understand. But this falls in the quite far out there category: I can't understand why they're doing it. Why are they taking this position? And let me just throw some stuff into the mix for discussion. I don't think the tech bros are the most attractive bunch of people in the world to the British public, right? I mean, they're very powerful. They're quite swaggering. They're quite arrogant, and they don't elicit much sympathy, I don't think. And seeing them sitting right there next to Donald Trump at the inauguration was pretty alarming.
You know, you imagine a kind of British
event of similar kind of importance and
symbolism and seeing, you know, the head
of the BBC and the head of BBC news and
the head of ITV news and all these
people standing, you know, right next to
the person who has, you know, close to
absolute power. I think British people
would find that very disturbing. Which brings me, in a very rambling, long way, to my question to you, which is: why is this government siding with the tech bros? That makes no sense.
>> Yeah. Well, yeah, it's...
>> Have a go. Have a go. Um...
>> Why is our Labour government backing these guys? And I think it has three parts. The first part is that
they see it as fundamental to their
negotiation with Trump and that tech has
Trump's back. Trump has tech's back. If
they piss off tech, they piss off Trump,
they won't get their deal. I think
that's huge.
>> Yeah. I think the second thing is, as they have been very clear in telling us, you know, they opened the door and the cupboard was empty, and I think that sitting in the inbox happened to be the AI opportunities plan, and I literally think that our prime minister picked it up and went, "Oh, here's something hopeful." And, as we've already discussed in this conversation, there is so much overpromising and so much threatening: if you don't do this, we go elsewhere; if you don't give us that, we'll get it in here. And, in an attempt to balance the books, they've done some very short-term deals about data centers which are arguably very bad for our energy network. They have done some very short-term deals about data, where they get a little bit of sorting out of our data in return for the long-term investment in data that could be, you know, the oil fields of the next generation. So I think they've been short-term, they've been frightened, they've been bullied. And, you know, it was all in this moment where the whole world is in this sort of tectonic plate movement, partly because Trump came in and said, "I'm going to recreate the world order."
So it's a bunch of crossroads. What I think the bigger question is, is how the hell did we let the tech bros get into this position, where they have more money, more resources, more power than most nation states? And how the hell did we buy this idea of tech exceptionalism? Because they are companies. They are, you know, they're car companies, they're media companies, they're publishers, they're shopkeepers, they're, you know, whatever they are. They are companies making a profit, providing services, but they have no responsibility for what they do. And that lack of responsibility, that lack of accountability, allows them to avoid regulation, allows them to avoid sort of public goods. And in avoiding that, they can grow very, very fast. And there's a sort of slightly toxic circle in which they pretend that they are the pipes, they determine the outcomes, and now they're setting the rules, and that is very, very dangerous. And the final thing I'll say, because my answer to your question is that the government have made a mistake and they should row back from that mistake, is that the government should be much more sophisticated about tech, and they should listen to vastly more people. They listen to the Tony Blair Institute, but the Tony Blair Institute has 300 million pounds from Larry Ellison, who happens to own 40% of the world's data. He is the CEO and owner of Oracle. Yeah. You know, they listen to Google DeepMind. Well, Google DeepMind are a brilliant organization, but they have a vested interest in one side of this equation. They are too narrow and they don't have a depth of knowledge. And the leadership thinks that AI is going to save us. And I think many, many people who know a lot more about tech think it will sink us, unless we actually recalibrate and go: you know, it needs energy, it needs talent, it needs data. Guess what? Britain has data. We have brilliant data. We have the creative data that we've been talking about. We have NHS data. We have the best CCTV there is. We have underwater data. We have data coming out of our ears. We've got some of the best collections, in the British Library and the museums. We have a great deal that they want, and we're giving it away instead of selling it...
>> or indeed sharing it as a participant in
the AI revolution.
>> What do you think happens next and what
happens next from your point of view?
Where does your focus lie in terms of
trying to persuade the government not to
entirely side with one side of this
argument?
>> I'm going to do the narrow answer first and then the broader one. What was great about the campaign was that it brought so many people, and such a broad group of people, together. So on the one hand, you know, you have the great Oscar-winning writers from the movies, and on the other hand you have, as you said in your introduction, Elton John, Dua Lipa, Paul McCartney, you know, I mean I can't think of anyone who wasn't there, Tom Jones...
>> Not very popular to go against all those people, you might think, but we covered that already.
>> Yeah. But also you had a lot of people from the design sector, some of the best of all the things where we don't know who did it, but our favourite chairs and the built environment; they all came in. Then, you know, the libraries and museums came and said, hey, hang on, what about us? Then the luxury brands came in, you know, Burberry and Rolls-Royce and so on; all their representatives came in because they all see this as a sort of national problem. We are really good at this stuff. We have a part to play. We would like to play an outsized part in this new world order, but we have to do it not on the basis of thievery. We have to do it on some equity exchange, and that's what they're looking for. I think that will be negotiated over time. I think there are immense problems about it. I'm not sure it'll be as fair as we would like it to be, but I think we will see a shift, because I've had enough conversations to know that, although there is a huge range, and I would actually just second what you say, it's not a binary. There's not the good and the bad. There is the common practice, and the absolutely wonderful, and the absolutely dreadful. And the common practice is also problematic. So
I think there's that. I think more
broadly, you know, we have to look
around and say, you know, the UK is not
the only one to bend over. I mean, I
think that in the EU, some member
states were absolutely horrified at the
deal that the EU made with America. I
think that Mark Carney looks like a
hero, you know, amongst modern leaders
for standing up to Trump. Um, and I
think that there will be some things
over the next couple of years that
actually just clip his wings a bit
because it may not work as he wishes it
to work. And if my final word could be this: it is this exceptionality, this not calling things by their name, that is the problem with tech. They are businesses. They are monopolies. They are, you know, actually consumer products without product liability. We have to start calling things what they are and treating them accordingly, because there is a pathway forward, but everyone wants a silver bullet, and it's not a silver bullet. The actual pathway is to treat them as they are. So when they publish, they have those rules. When they sell to consumers, they have those rules. When they hurt children, they go to prison. That's what we need.
>> I found that a fascinating
conversation. So, thank you very, very
much indeed for taking the time to join
us. Let's continue it as the years go
on. We'll be dropping more Talking Politics Q&A episodes across the summer
as Robert and I answer your questions.
And if you haven't already hit subscribe
on YouTube or any podcast platform to
make sure you never miss out on an
episode, then please do. But that's it
for now. Goodbye.